10 - Beyond the Patterns - Markus Haltmeier - Learned Analysis and Synthesis Regularisation of Inverse Problems [ID:27843]

Welcome back to Beyond the Patterns, and happy new year 2021. As you can see, we are still in lockdown in Germany, so all the hairdressers are closed and it's kind of difficult to get by. But this doesn't mean that we can't continue to have good science talks.

So this is why I invited another speaker to our small round here, and today I'm very proud to present Markus Haltmeier. He received his PhD in mathematics from the University of Innsbruck, Tyrol, Austria, in 2007 for his work on computed tomography. He then worked as a researcher at the University of Innsbruck, the University of Vienna, Austria, and the Max Planck Institute for Biophysical Chemistry in Göttingen, Germany, on various aspects of inverse problems. Since 2012 he has been a full professor in the Department of Mathematics at the University of Innsbruck. He is currently interested in inverse problems, regularization theory, signal and image processing, computed tomography, photoacoustic imaging, and machine learning. So today it's a great pleasure to announce Markus, and his presentation is entitled "Learned Analysis and Synthesis Regularization of Inverse Problems".

Markus, glad to have you here, and the stage is yours.

Thank you, Andreas, for this nice introduction and the invitation to give a talk here. As you can see from the title, the talk is about learned analysis and synthesis regularization of inverse problems. Since we have sufficient time, as I heard, the talk will be rather broad, and only in the last part will I really discuss learned versions of analysis and synthesis regularization. I will motivate the need for regularization, hopefully explain what I understand by analysis and synthesis regularization, and then give some examples and some results. It is also a mathematical talk, even though I will not go into the details much.

So, basically, the outline is as follows. I will give a rather long introduction to inverse problems. Probably all of you are familiar with them; nevertheless, I will briefly recall inverse problems and in particular their ill-posedness, that is, why there is a need for regularization methods. I will also recall what a regularization method is and give that definition in a particularly simple situation, using the Moore-Penrose inverse. For frame-based regularization the intention is of course to have a somewhat more general setting; nevertheless, I will recall these basics in the elementary setting.
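To make this recall concrete, here is a minimal textbook-style sketch of the elementary linear setting mentioned above; it summarizes standard regularization theory and is not taken verbatim from the talk:

```latex
% Elementary linear setting: K : X -> Y bounded linear, exact data y = K u,
% noisy data y_\delta with \|y_\delta - y\| <= \delta, and K^+ the
% Moore-Penrose (pseudo-)inverse of K.
% A family (R_\alpha)_{\alpha>0} of continuous maps R_\alpha : Y -> X,
% together with a parameter choice rule \alpha = \alpha(\delta, y_\delta),
% is called a regularization method for K^+ if, for all y in the domain of K^+,
\[
  \sup_{\|y_\delta - y\| \le \delta}
    \bigl\| R_{\alpha(\delta, y_\delta)}\, y_\delta - K^+ y \bigr\|
    \;\longrightarrow\; 0
  \qquad \text{as } \delta \to 0 .
\]
% The classical example is Tikhonov regularization,
\[
  R_\alpha y
  = (K^* K + \alpha\, \mathrm{Id})^{-1} K^* y
  = \operatorname*{arg\,min}_{u}\; \tfrac{1}{2}\|K u - y\|^2
    + \tfrac{\alpha}{2}\|u\|^2 .
\]
```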

Then there is the second part, and parts two and three can be considered the main parts. I will discuss frame-based regularization, where we use the analysis and synthesis models. In this case the analysis and synthesis operators, which encode the prior information, are linear, so you can think of them as being given by frames. In section three we then recall some approaches we recently used to obtain learned analysis and synthesis regularization, where instead of frames these operators are taken as neural networks.
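As a rough orientation (a schematic summary of my own, not a formula taken from the slides), the two frame-based variants and their learned counterparts can be written as variational problems of the following form, where K is the forward operator, W the analysis operator, Φ the synthesis operator, and α > 0 a regularization parameter:

```latex
% Analysis regularization: penalize the analysis coefficients W u of the
% reconstruction itself.
\[
  \min_{u}\; \tfrac{1}{2}\,\| K u - y \|^2 \;+\; \alpha\,\| W u \|_1 .
\]
% Synthesis regularization: synthesize u = \Phi \xi from coefficients \xi
% and penalize the coefficients.
\[
  \min_{\xi}\; \tfrac{1}{2}\,\| K \Phi \xi - y \|^2 \;+\; \alpha\,\| \xi \|_1 ,
  \qquad u = \Phi\,\xi .
\]
% Learned variants (schematically): replace the linear frame operators by
% trained networks, e.g. a nonlinear analysis map W_\theta or a nonlinear
% synthesis network \Phi_\theta, possibly combined with a more general
% coefficient penalty than the \ell^1-norm.
```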

I will also discuss some learning strategies; of course there would be many possible alternatives and other strategies. I initially intended to give an overview of the related literature, but in the end I decided not to, because it is difficult to give a really broad and fair overview. Therefore I focus on the approaches and methods we used, rather than risk omitting something others did.

Okay, I will now start, as I said, with an introduction, recalling what type of problems we want to solve and how many people formulate and think of inverse problems. Here is the standard formulation, and hopefully you can still see my mouse: an inverse problem is the problem of approximating, or basically recovering, the unknown u given data K(u), where K is the forward model and K(u) is corrupted by some perturbation z. I said perturbation; usually I will call it noise. So K(u) + z = y, that is the noisy data, and from this noisy data our aim is to recover the unknown u.
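To make this setting tangible, the following is a small, self-contained numerical sketch (my own illustration, not code from the talk) of the model K u + z = y for a linear forward operator, together with the Tikhonov-regularized reconstruction recalled above; the operator, noise level, and parameter values are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ill-conditioned linear forward model K: a Gaussian blurring matrix.
n = 100
t = np.linspace(0.0, 1.0, n)
K = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.002)
K /= K.sum(axis=1, keepdims=True)

# Ground-truth unknown u and noisy data y = K u + z.
u_true = (np.abs(t - 0.3) < 0.1).astype(float) + 0.5 * (np.abs(t - 0.7) < 0.05)
z = 0.01 * rng.standard_normal(n)          # additive noise (perturbation)
y = K @ u_true + z

# Naive inversion amplifies the noise (ill-posedness) ...
u_naive = np.linalg.solve(K, y)

# ... while Tikhonov regularization, u_alpha = (K^T K + alpha I)^{-1} K^T y,
# trades data fidelity against stability.
alpha = 1e-3
u_tik = np.linalg.solve(K.T @ K + alpha * np.eye(n), K.T @ y)

print("relative error, naive   :",
      np.linalg.norm(u_naive - u_true) / np.linalg.norm(u_true))
print("relative error, Tikhonov:",
      np.linalg.norm(u_tik - u_true) / np.linalg.norm(u_true))
```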

Part of a video series:

Accessible via: Open Access
Duration: 02:07:13
Recording date: 2021-01-13
Uploaded on: 2021-01-14 02:09:17
Language: en-US

It’s a great pleasure to welcome Prof. Dr. Markus Haltmeier for an invited talk on Image Reconstruction.

Abstract: Inverse problems consist of finding accurate approximations of an unknown x from noisy data y = A(x) + b, where A is the so-called forward operator, b represents the noise (data perturbation), and y is the given noisy data. The characteristic property of inverse problems is their ill-posedness, which means that A(x) = y has no unique solution or the solution depends unstably on the given data. To obtain stable and accurate solutions, regularisation methods incorporate additional available information about the unknown and the noise. In this talk, we review classical frame-based analysis and synthesis regularisation. We then present recent extensions that use neural networks as nonlinear synthesis and analysis operators. A mathematical analysis is given, possible training strategies are discussed, and connections to related work are presented. This talk is based on:
[1] H. Li, J. Schwab, S. Antholzer, M. Haltmeier. NETT: Solving inverse problems with deep neural networks. Inverse Problems 36, 2020.
[2] D. Obmann, L. Nguyen, J. Schwab, M. Haltmeier. Sparse l^q-regularization of inverse problems with deep learning, arXiv:1908.03006 [math.NA], 2020.
[3] D. Obmann, J. Schwab, M. Haltmeier. Deep synthesis regularization of inverse problems, Inverse Problems 37, 2021.
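As a complement to the abstract, here is a minimal sketch (a generic illustration, not the algorithm of the cited papers) of sparse synthesis regularization solved with iterative soft thresholding (ISTA); the synthesis operator is a plain identity dictionary for simplicity, whereas a learned variant would replace it by a trained network:

```python
import numpy as np

def ista_synthesis(K, Phi, y, alpha=0.01, step=None, iters=500):
    """Minimize 0.5*||K @ Phi @ xi - y||^2 + alpha*||xi||_1 over xi via ISTA.

    K   : forward matrix, shape (m, n)
    Phi : linear synthesis operator (columns = dictionary atoms), shape (n, p)
    Returns the reconstruction u = Phi @ xi.
    """
    A = K @ Phi
    if step is None:
        # Step size 1/L, with L = ||A||_2^2 the Lipschitz constant of the gradient.
        step = 1.0 / (np.linalg.norm(A, 2) ** 2)
    xi = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ xi - y)                                       # data-term gradient
        xi = xi - step * grad                                           # gradient step
        xi = np.sign(xi) * np.maximum(np.abs(xi) - step * alpha, 0.0)   # soft thresholding
    return Phi @ xi

# Tiny usage example: sparse unknown, underdetermined forward operator.
rng = np.random.default_rng(1)
n, m = 50, 30
K = rng.standard_normal((m, n)) / np.sqrt(m)
u_true = np.zeros(n)
u_true[[5, 17, 33]] = [1.0, -2.0, 1.5]
y = K @ u_true + 0.01 * rng.standard_normal(m)
u_rec = ista_synthesis(K, np.eye(n), y, alpha=0.02)
print("largest recovered entries at indices:", np.argsort(-np.abs(u_rec))[:3])
```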

Short Bio: Markus Haltmeier received his Ph.D. in mathematics from the University of Innsbruck, Tyrol, Austria, in 2007 for his work on computed tomography. He then worked as a researcher at the University of Innsbruck, the University of Vienna, Austria, and the Max Planck Institute for Biophysical Chemistry in Göttingen, Germany, on various aspects of inverse problems. Since 2012, he has been a full professor at the Department of Mathematics, University of Innsbruck. His current research interests include inverse problems, regularisation theory, signal and image processing, computed tomography, photoacoustic imaging, and machine learning.

This video is released under CC BY 4.0. Please feel free to share and reuse.

For reminders to watch new videos, follow us on Twitter or LinkedIn. Also, join our network for information about talks, videos, and job offers in our Facebook and LinkedIn groups.

Music Reference: 
Damiano Baldoni - Thinking of You (Intro)
Damiano Baldoni - Poenia (Outro)

Tags: beyond the patterns